In vector calculus, the Jacobian matrix is the matrix of all first-order partial derivatives of a vector-valued function. When the matrix is square, both the matrix and its determinant are referred to as the Jacobian in the literature.

Suppose <math>\mathbf{f} : \mathbb{R}^n \to \mathbb{R}^m</math> is a function which takes as input the vector <math>\mathbf{x} \in \mathbb{R}^n</math> and produces as output the vector <math>\mathbf{f}(\mathbf{x}) \in \mathbb{R}^m</math>. Then the Jacobian matrix of <math>\mathbf{f}</math> is an <math>m \times n</math> matrix, usually defined and arranged as follows:

: <math>\mathbf{J} = \begin{bmatrix} \dfrac{\partial \mathbf{f}}{\partial x_1} & \cdots & \dfrac{\partial \mathbf{f}}{\partial x_n} \end{bmatrix} = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{bmatrix}</math>

or, component-wise:

: <math>\mathbf{J}_{ij} = \dfrac{\partial f_i}{\partial x_j}.</math>

This matrix, whose entries are functions of <math>\mathbf{x}</math>, is also denoted by <math>D\mathbf{f}</math>, <math>\mathbf{J}_{\mathbf{f}}</math>, and <math>\dfrac{\partial(f_1, \ldots, f_m)}{\partial(x_1, \ldots, x_n)}</math>. (Note that some literature defines the Jacobian as the transpose of the matrix given above.)

The Jacobian matrix is important because if the function <math>\mathbf{f}</math> is differentiable at a point <math>\mathbf{x}</math> (this is a slightly stronger condition than merely requiring that all partial derivatives exist there), then the Jacobian matrix defines a linear map <math>\mathbb{R}^n \to \mathbb{R}^m</math>, which is the best linear approximation of the function <math>\mathbf{f}</math> near the point <math>\mathbf{x}</math>. This linear map is thus the generalization of the usual notion of derivative, and is called the ''derivative'' or the ''differential'' of <math>\mathbf{f}</math> at <math>\mathbf{x}</math>.

If <math>m = n</math>, the Jacobian matrix is a square matrix, and its determinant, a function of <math>\mathbf{x}</math>, is the Jacobian determinant of <math>\mathbf{f}</math>. It carries important information about the local behavior of <math>\mathbf{f}</math>. In particular, a continuously differentiable function <math>\mathbf{f}</math> has, in a neighborhood of a point <math>\mathbf{p}</math>, an inverse function that is differentiable if and only if the Jacobian determinant is nonzero at <math>\mathbf{p}</math> (see inverse function theorem). The Jacobian determinant also occurs when changing the variables in multi-variable integrals (see substitution rule for multiple variables).

If <math>m = 1</math>, <math>f</math> is a scalar field and the Jacobian matrix reduces to a row vector of partial derivatives of <math>f</math>—i.e. the transpose of the gradient of <math>f</math>.

These concepts are named after the mathematician Carl Gustav Jacob Jacobi (1804–1851).
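The definition above can be checked numerically. The following sketch (not part of the original article; the map and helper names are illustrative choices) approximates the Jacobian entry-by-entry with central finite differences, using the polar-to-Cartesian map <math>\mathbf{f}(r, \theta) = (r\cos\theta, r\sin\theta)</math>, whose Jacobian determinant is known to be <math>r</math>:

```python
import numpy as np

def jacobian_fd(f, x, eps=1e-6):
    """Approximate the m-by-n Jacobian of f at x by central differences:
    column j holds (f(x + eps*e_j) - f(x - eps*e_j)) / (2*eps)."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x), dtype=float)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        J[:, j] = (np.asarray(f(x + step)) - np.asarray(f(x - step))) / (2 * eps)
    return J

# Illustrative example: polar-to-Cartesian map f(r, theta) = (r cos(theta), r sin(theta)).
# Its analytic Jacobian is [[cos t, -r sin t], [sin t, r cos t]], with determinant r.
def polar_to_cartesian(p):
    r, theta = p
    return np.array([r * np.cos(theta), r * np.sin(theta)])

p = np.array([2.0, np.pi / 3])
J = jacobian_fd(polar_to_cartesian, p)
print(J)
print(np.linalg.det(J))  # close to r = 2.0
```

The determinant equaling <math>r</math> is exactly the factor that appears in the change of variables <math>dx\,dy = r\,dr\,d\theta</math> for polar coordinates.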
== Jacobian matrix ==
The Jacobian generalizes the gradient of a scalar-valued function of multiple variables, which itself generalizes the derivative of a scalar-valued function of a single variable. In other words, the Jacobian of a scalar-valued multivariable function is the gradient (as a row vector), and that of a scalar-valued function of a single variable is simply its derivative. The Jacobian can also be thought of as describing the amount of "stretching", "rotating" or "transforming" that a transformation imposes locally. For example, if <math>\mathbf{f}(\mathbf{x})</math> is used to transform an image, the Jacobian <math>\mathbf{J}_{\mathbf{f}}(\mathbf{p})</math> describes how the image in the neighborhood of <math>\mathbf{p}</math> is transformed.

If a function is differentiable at a point, its derivative is given in coordinates by the Jacobian, but a function does not need to be differentiable for the Jacobian to be defined, since only the partial derivatives are required to exist. If <math>\mathbf{p}</math> is a point in <math>\mathbb{R}^n</math> and <math>\mathbf{f}</math> is differentiable at <math>\mathbf{p}</math>, then its derivative is given by <math>\mathbf{J}_{\mathbf{f}}(\mathbf{p})</math>. In this case, the linear map described by <math>\mathbf{J}_{\mathbf{f}}(\mathbf{p})</math> is the best linear approximation of <math>\mathbf{f}</math> near the point <math>\mathbf{p}</math>, in the sense that

: <math>\mathbf{f}(\mathbf{x}) = \mathbf{f}(\mathbf{p}) + \mathbf{J}_{\mathbf{f}}(\mathbf{p})(\mathbf{x} - \mathbf{p}) + o(\|\mathbf{x} - \mathbf{p}\|)</math>

for <math>\mathbf{x}</math> close to <math>\mathbf{p}</math>, where <math>o</math> is the little-o notation (for <math>\mathbf{x} \to \mathbf{p}</math>) and <math>\|\mathbf{x} - \mathbf{p}\|</math> is the distance between <math>\mathbf{x}</math> and <math>\mathbf{p}</math>. Compare this to a Taylor series for a scalar function of a scalar argument, truncated to first order:

: <math>f(x) = f(p) + f'(p)(x - p) + o(x - p).</math>

In a sense, both the gradient and Jacobian are "first derivatives"—the former the first derivative of a ''scalar function'' of several variables, the latter the first derivative of a ''vector function'' of several variables. The Jacobian of the gradient of a scalar function of several variables has a special name: the Hessian matrix, which in a sense is the "second derivative" of the function in question.
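The "best linear approximation" property above can be observed numerically: the approximation error shrinks faster than <math>\|\mathbf{x} - \mathbf{p}\|</math>. The sketch below (an illustrative example, not from the article; the function <math>\mathbf{f}</math> is an arbitrary smooth map chosen here) compares <math>\mathbf{f}(\mathbf{x})</math> with its linearization <math>\mathbf{f}(\mathbf{p}) + \mathbf{J}_{\mathbf{f}}(\mathbf{p})(\mathbf{x} - \mathbf{p})</math> and prints the error divided by the step size, which tends to zero as required by the little-o condition:

```python
import numpy as np

# Illustrative smooth map f : R^2 -> R^2, f(x, y) = (x^2 + y, sin(x*y)).
def f(v):
    x, y = v
    return np.array([x**2 + y, np.sin(x * y)])

def jacobian_f(v):
    # Analytic Jacobian: row i is the gradient of component f_i.
    x, y = v
    return np.array([[2 * x, 1.0],
                     [y * np.cos(x * y), x * np.cos(x * y)]])

p = np.array([1.0, 0.5])
direction = np.array([0.6, -0.8])  # unit vector, so ||x - p|| = h

for h in (1e-1, 1e-2, 1e-3):
    x = p + h * direction
    linear = f(p) + jacobian_f(p) @ (x - p)   # best linear approximation at p
    err = np.linalg.norm(f(x) - linear)
    # err/h shrinks with h, i.e. the error is o(||x - p||)
    print(f"h = {h:g}   error/h = {err / h:.6f}")
```

That the ratio <math>\text{error}/h</math> itself decays roughly linearly in <math>h</math> reflects the fact that for twice-differentiable functions the remainder is actually <math>O(h^2)</math>, which is stronger than the <math>o(h)</math> guaranteed by differentiability alone.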